perm filename LECT.SCZ[P,JRA]   filedate 1974-11-06
LECT 1
survey of highlights in 4 hours.
mechanics/basics   domain + functions (+ funargs + efficiency (progs + rplaca/rplacd))
evaluation         interpretation + Scott
implementation     machines + g.c. + binding strategies + symbol tables
compilation        techniques + debugging + editors + ...
implications       how to design better languages, abstract syntax, data structures
theory             proofs of progs, correctness
applications       ai, theorem provers + verifiers
here, do 1 and 2; not sure yet what to do for three
BASICS
formal language
domain:symbolic expressions
operations
domain atomic
non-atomic:dotted pairs
atoms
literal atoms=uppercase identifiers
numbers (uninteresting)
non-atomic: dotted pairs
examples.
interpretation as binary trees; easy.
functions: primitive
cons[x;y] total; forms dotted pair; called the constructor
car,cdr partial; first or second component; called selectors.
composition allowed; abbreviation of car-cdr chains
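e.g. (standard abbreviations):
    cadr[x] <= car[cdr[x]]        caddr[x] <= car[cdr[cdr[x]]]
    cadr[(A . (B . C))] = B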
informal definition facility: <=
not very exciting so far. need conditional
[p1→e1; ....] semantics
obviously need predicates;
LISP maps true to T and false to NIL thus predicates have values in domain
not truth values as normally understood in math; thus can compose
o.w. condition
two primitive: atom and eq
atom total
eq only for atoms
example:fact
equal
(subst)
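plausible definitions, in the notation above (sketches, not necessarily the notes' versions):
    fact[n] <= [n=0 → 1; T → n*fact[n-1]]

    equal[x;y] <= [atom[x] → [atom[y] → eq[x;y]; T → NIL];
                   atom[y] → NIL;
                   equal[car[x];car[y]] → equal[cdr[x];cdr[y]];
                   T → NIL]

    subst[x;y;z] <= [equal[y;z] → x;
                     atom[z] → z;
                     T → cons[subst[x;y;car[z]]; subst[x;y;cdr[z]]]]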
list notation: subset of domain of sexprs useful for representing sequences
anecdote on ",".
constructor: list
selectors: cad(nth)r
predicate: null
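e.g.:
    (A B C)  abbreviates  (A . (B . (C . NIL)))
    list[A;B;C] = (A B C)     cadr[(A B C)] = B     null[NIL] = T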
example length
append
fact1
reverse
do length1
relation between append and reverse.
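sketches of these (not necessarily as given in class):
    length[x]   <= [null[x] → 0; T → length[cdr[x]]+1]
    append[x;y] <= [null[x] → y; T → cons[car[x]; append[cdr[x];y]]]
    reverse[x]  <= [null[x] → NIL; T → append[reverse[cdr[x]]; list[car[x]]]]
    and the relation:  reverse[append[x;y]] = append[reverse[y]; reverse[x]]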
do example of differentiation.
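one plausible version (a sketch; assumes sums and products are binary, represented (PLUS e1 e2) and (TIMES e1 e2)):
    diff[e;x] <= [atom[e] → [eq[e;x] → 1; T → 0];
                  eq[car[e];PLUS]  → list[PLUS; diff[cadr[e];x]; diff[caddr[e];x]];
                  eq[car[e];TIMES] → list[PLUS;
                                          list[TIMES; cadr[e];  diff[caddr[e];x]];
                                          list[TIMES; caddr[e]; diff[cadr[e];x]]]]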
there's a section on tricks in the notes, but there's a technique
which is universally applicable in LISP function writing which
I wish to exemplify.
importance
shows intuitive alg which is naturally recursive
shows mapping of intuition to programming language
shows LISP power
will show how to turn it into an abstract algorithm
map predicates, or recognizers, then constructors, then selectors
DO tgmoaf
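a plausible skeleton for a tgmoaf-style evaluator -- quoted constants plus the five primitives only (a sketch, not necessarily the notes' version):
    tgmoaf[e] <=
      [eq[car[e];QUOTE] → cadr[e];
       eq[car[e];CAR]   → car[tgmoaf[cadr[e]]];
       eq[car[e];CDR]   → cdr[tgmoaf[cadr[e]]];
       eq[car[e];CONS]  → cons[tgmoaf[cadr[e]]; tgmoaf[caddr[e]]];
       eq[car[e];ATOM]  → atom[tgmoaf[cadr[e]]];
       eq[car[e];EQ]    → eq[tgmoaf[cadr[e]]; tgmoaf[caddr[e]]]]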
LECT 2: evaluation
review of deriv and representation
intuitive sketch
constants
vars and environment
function calls: decision cbv and l-r
how do we find name
conditionals
example rev1[x,y] <= [null[x] → y; T → rev1[cdr[x];cons[car[x];y]]]
for rev1[(A B C)]
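plausible expansion (assuming the second argument starts as NIL):
    rev1[(A B C); NIL] = rev1[(B C); (A)]
                       = rev1[(C); (B A)]
                       = rev1[NIL; (C B A)]
                       = (C B A)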
note recursiveness of evaluation
cf. diff example
look at tgmoaf
do mapping of LISP expressions≡forms to sexprs
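e.g. the standard mapping:
    x                    =>  X
    A  (a constant)      =>  (QUOTE A)
    f[x;y]               =>  (F X Y)
    [p1 → e1; T → e2]    =>  (COND (p1 e1) (T e2))
    cons[car[x]; A]      =>  (CONS (CAR X) (QUOTE A))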
start eval
question: how to find vars and functions
symb. tabs and assoc for lookup; cons for entry
form-valued vars.
start apply
how isfun recognizes: (FN ...) but compare (COND ...)
special forms: add evcond
var is function; what is f[x;y]: compare cons[A;B] and cons [x;y]
use f[x;y] <= x+y
what about functions? the λ-calculus
λ-calc
terminology λ-vars,λ-list, body
syntax-pragmatics-semantics
<= looks like assignment? or does it?
DBA
finish basic eval-apply
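a compact version for reference (a sketch; isconst, isquote, iscond, lookup, pairlis, vars, body etc. are assumed helpers, and details differ from any real implementation):
    eval[e;env] <=
      [isconst[e] → e;
       atom[e]    → lookup[e;env];
       isquote[e] → cadr[e];
       iscond[e]  → evcond[cdr[e];env];
       T          → apply[car[e]; evlis[cdr[e];env]; env]]

    apply[fn;args;env] <=
      [isprim[fn]   → applyprim[fn;args];
       islambda[fn] → eval[body[fn]; pairlis[vars[fn];args;env]];
       T            → apply[eval[fn;env]; args; env]]

    evcond[c;env] <= [eval[caar[c];env] → eval[cadar[c];env];
                      T → evcond[cdr[c];env]]

    evlis[l;env]  <= [null[l] → NIL;
                      T → cons[eval[car[l];env]; evlis[cdr[l];env]]]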
what about defns: label
label works as assignment in LISP, but ok.
problems of globals
disallow; lose very big.
take previous val;lose:fact
take value when applied;lose funarg
fact-foo DBA example again
x=1 vs. x←1
= as predicate and as equation x=x↑2 + 6
fixpoint
functions get messy: get notation: Weizenbaum B-W.
functional args: what does it mean? foo[x,y] <= x[y]
rep as sexpr
functional args passed in
functions as values
but label is good for lisp
adding functions
adding other things: special forms; and[x1; ...; xn]
point to implementations and p-lists
scott: parallel situation to λ-calc 'til 1969
for lisp
why: over-specify
precision
therefore mathematics available for proofs of properties
on (yecch) machine
define and evalquote
LECT 3: implementation -- static
review eval with implementation in mind
eval constants and conditionals
apply primitives, lambdas and fn names
question why not handle cond inside apply
special forms: another example, "and"
two kinds of evaluation--extend to user
another problem: definitions go away: eval[e; env]
two solutions: start with BIG env; or hand-code them into eval and apply
bindings
globals: dynamic binding; look at structure of alist
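e.g. with f[x;y] <= x+y from before, a call f[2;3] puts the new pairs on the front of the a-list (a sketch):
    ((X . 2) (Y . 3) . <older bindings>)
assoc finds the most recent (dynamic) binding first.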
can't outlaw; c.f. fact; different solution: fix-point
functional args
coming in
going out
implementation
funarg hack: lisp was first
(FUNARG fn st)
do Weizenbaum examples
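one plausible example of the kind meant (a sketch):
    foo[x]   <= bar[2; function[λ[[y] x+y]]]
    bar[u;f] <= f[u]
with plain dynamic binding, x inside the λ sees whatever binding is current when
f is finally applied; (FUNARG (LAMBDA (Y) (PLUS X Y)) st) carries the a-list st
from the point where function[...] was evaluated, so x gets foo's x.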
implementation
separate out symbol table: eval[e]
different properties expr, fexpr
efficient
box notation NIL and atom header
properties: subr fsubr expr fexpr value
attribute-val pairs
atom header thus atom[x]
search: GET; add PUTPROP
unique atoms (via read) thus eq[x;y]
what about printing (A.B) thus PNAME
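e.g. a p-list might look like (layout and indicator names are a sketch; they vary by implementation):
    FACT:  (PNAME <print name> EXPR (LAMBDA (N) (COND ((EQ N 0) 1)
                                                      (T (TIMES N (FACT (SUB1 N)))))))
    (GET (QUOTE FACT) (QUOTE EXPR))          fetches the definition
    (PUTPROP (QUOTE FACT) def (QUOTE EXPR))  installs one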
worry about binding after we get machine set up.
what about cons: free space and full word space
gc: base pointers: symbol table and partial comp
basic structure of lisp machine; hs. about inst and data
read eval print loop
details of read-ratom, parser, scanner
hashing, oblist,buckets ***if time***
ratom-unique atom
do (CONS (QUOTE A)(QUOTE B))
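roughly (a sketch):
    read   builds the list structure (CONS (QUOTE A) (QUOTE B)), interning CONS, QUOTE, A, B once each
    eval   sees a non-atomic form; CONS has a SUBR property; evaluates the args:
              (QUOTE A) => A,  (QUOTE B) => B
    apply  runs the SUBR on A and B, giving (A . B)
    print  uses the PNAMEs to type out (A . B)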
now finally, bindings
in apply
islambda => save old, bind new, eval body, restore old, buzz off
recall assoc
look at value cell...special push-down , mscw
advantages, fast save and restore
disadvantages, funargs
recall old
now at defn save old sp
at apply restore to state at old sp, while saving current
bind locals and go; at end, rebind, flushing to state at entry.
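e.g. entering λ[[x;y] ...] applied to 2 and 3 with shallow binding (a sketch):
    push old contents of X and Y value cells on the special pushdown
    value cell of X ← 2, value cell of Y ← 3; eval the body (free refs read the cells directly)
    on exit pop the pushdown, restoring the old contents
for a funarg, first unwind to the sp saved at creation (saving what is there now), then as above; undo on the way out.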
sigh
bootstrapping
next time compilers, and machines
wed??? implications and scott
LECT 4: compilation and dynamics
**************************
*****tell about problem on p149.******
put evlis and evcon on the board
***************************
REVIEW
p-lists
values-cell and binding
funargs
structure of new eval.. notice now look first at fn before args
indicators expr fexpr value macro
do "and"
compilers, machines and implementation of recursion
what is it: relation to interpretation; nothing conceptual
why compile: speed + extra sexpr space..conceptually no better
bootstrapping
structure compiler: generators
machine to machine
lang to extension
binary program space (hexadecimal)
LISP compilers don't hack syntax
compilers in tgm-steps
functions composition and constant args
refer to evlis
calling conventions: stack+ac1-acn
value returned convention
comp,push, comp, push ..... loadac[m]
do it; MUST since crucial to following var arg!!
integrity of stack
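e.g. code for cons[A;B] (constant args) might come out as (LAP-ish sketch; exact conventions vary):
    (MOVEI 1 (QUOTE A))      arg 1 in ac1
    (MOVEI 2 (QUOTE B))      arg 2 in ac2
    (CALL 2 (E CONS))        value comes back in ac1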
add conditionals
refer to evcond
linearize cond tree
start comcond[((p1 e1) ...(pn en))]
convention on truth, JUMPE; gensym
write comcond; and add recognizer in compexp
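the generated code has roughly this shape (a sketch; l1, l2, out are gensym-ed labels, NIL in ac1 means false):
      <code for p1>
      (JUMPE 1 l1)         false: try next clause
      <code for e1>
      (JRST 0 out)         done, value of e1 in ac1
    l1
      <code for p2>
      (JUMPE 1 l2)
      ...
    out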
one-pass assemblers
fixups and forward refs
add local variables
consider λ[x;y;z] ...x...y...z
convention says at call v(x), v(y), v(z) are in ac1,2,3; save 'em;
partial comp changes t.o.s.
who? complis.
good example: j[x;y] <= f[g[x];h[y]]  (code sketched below)
convention: ...-n P)
offset + rel pos
lookup is offset+cdr[assoc...]
conventions: λ[x y z] => prup, PUSH, PUSH, ...
clean-up stack and POPJ
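putting it together for j[x;y] <= f[g[x];h[y]] (LAP-ish sketch; conventions as above, and they vary):
    (LAP J SUBR)
      (PUSH P 1)           save x
      (PUSH P 2)           save y; now y at 0 P, x at -1 P
      (MOVE 1 -1 P)        x → ac1
      (CALL 1 (E G))       g[x] → ac1
      (PUSH P 1)           stack the partial result; offsets shift by one
      (MOVE 1 -1 P)        y → ac1
      (CALL 1 (E H))       h[y] → ac1
      (PUSH P 1)
      (POP P 2)            h[y] → ac2
      (POP P 1)            g[x] → ac1
      (CALL 2 (E F))       f[g[x];h[y]] → ac1
      (SUB P (C 0 0 2 2))  flush the two λ-bindings
      (POPJ P)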
add globals shallow and deep
stack won't work as is: only value is saved on stack;special cell
deep as in interlisp; save name-value; like a-list; compiler can be smart
stupid interpreter; stupid interlisp
compiling other parts of lisp: fexprs, macros. functional args. progs.
debuggers and editors
tracing: hide defn; replace with tracer; similar monitor on assignment
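one plausible shape for a tracer (a sketch; the property name TRACEFN is made up, and apply/prog2 conventions vary by implementation):
    move FOO's real definition from EXPR to TRACEFN, then install under EXPR
      (LAMBDA (X)
        (PROG2 (PRINT (LIST (QUOTE ENTER) (QUOTE FOO) X))
               ((LAMBDA (VAL)
                  (PROG2 (PRINT (LIST (QUOTE EXIT) (QUOTE FOO) VAL)) VAL))
                (APPLY (GET (QUOTE FOO) (QUOTE TRACEFN)) (LIST X)))))
    untrace just moves the saved definition back.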
what should LISP have:
documentation: damn tired of people reinventing McCarthy's ideas
user-defined ds.
structuring s.t. not only an impl. in the lang but THE impl in the lang
sequences, structures, pointers
abstract control
better closures
form vars
LECT 5: reflections...In a pig's eye!!!
to: pdp-11 hackers: look at LISP as if you were bootstrapping to the 11
don't look at all the details.
reflections
remarks on short course and teaching
students, particularly grad students, should be stirred up
teachers are not dispensers of wisdom in this field
structured programming
program is not, programming should be.
structured apl (NO WAY), cf. siftup (BOO!)
what is interesting
syntax isn't
compilation isn't
compilers, syntax analysis and pl1 have done much to retard c.s.
semantics is
structure of language system
problems in programming are not in efficiency but in correctness!!!!!!!
people gasp at representation of lisp-like inefficiency, but are
conditioned to long programming development and debugging time, buggy
programs and crap. even if machines were made ten times faster the
debugging problem would not go away (but LISP interpretation
would be acceptable)
required: not necessarily fol proofs but proofs with "conviction".
COMPARE AI AND THEORY: mismatch between titles and content of papers
ai: program which plays chess => 4x4 board with two pieces
theory: program verification system for pascal-lisp-pl1
all languages are bad: some are worse than others
what is a lisp-like lang; why is it better/different.
why is LISP still around? very strange reasons given...
apg: 1)formal; 2) guess what i'm thinking
what is feasible, desirable, useful...
not mind reader
not knowledge of programming area
programs which try to be smart lose very big.
know about programming, types, ...
competent assistant is reasonable
semantics
operational
axiomatic
denotational
language: syntax pragmatics and semantics
benefits of denotational modeling
precision => provability
don't overspecify cf. order in cond
correctness of compilers and evaluation
compiler correctness is different from prog correctness
equivalence-Boyer Moore
manipulation: darlington
lcf and Scott
cf. axioms + correctness, completeness proofs
semantic equations, existence of model
mal
correctness of compilers and interpreters for lisp subsets
axioms for formal manipulation of lisp progs.
mg
scott for lisp
semantic equations
model
rules of inference
tennent
good popularizer, QUEST
milne
attempts to analyze "realistic" implementations; unreadable
wadsworth
analysis of λ-cal
implications
el1 and s-lisp
planner
calling by pattern
data bases
conniver
add -- if-added
remove -- if-removed
fetch -- if-needed try-next and possibilities lists
make prog look like data
control and access environs
adieu and au-revoir
contexts for d.b.'s
actors and control???
list of papers to read
humble programmer
three computer cultures
authors, wegner, scott, reynolds,
organized bull
comments on teaching short course
high level vs. details --picked h.l. because that's interesting
lack of time for questions -- very bad
cf. (F A1 A2 ... AN)
1) ignore
2) "not the way it's done anyway
missed parts
efficiency, applications, a-i languages
what is wrong with lisp
1) NOT syntax
2) closures not done well
3) d.s. weak
a) for user
b) for LISP i.e. implementation of THE interpreter, not AN interpreter
what is lisp-like lang
1) interpreter based
2) which works on NATURAL rep of prog.
clearly other languages can effect the map-- all are disgustingly the same.
but lisp did it intentionally
why it's good for you
software problem: get programs which work. demonstrably correct
problem is conceptual not technological
l.s system
do NOT want "smart" system. pain in ass
do NOT want "mind-reader" system. pain in ass
invariably "stupid".
structuring of structured programming occurs in construction
cf. siftup
what is structured programming. (sic)ing.
easier to say what it is NOT. not block structure, not APL
system components: language; command language to manipulate progs
editor debugger, verifier, interpreter, compiler
how to make lisp better
user defined data structures
what is d.s.? dunno. again what it is not c.f. actors and DCL
more than stupid program...conceptually different
c.f. everything is atom and driving to LA.
anecdote on CDR and DBA
like type...abstract syntax
abstract control....cf. set var
operations must map somehow to ppl.
system must be able to manipulate programs to tell if correct.
semantics
axiomatic
axioms and rules of inference, reduction, simplification rules
hoare
mg for LISP
λ-calc
order of evaluation built into LISP but not into λ-calc
normal forms in λ-cal
a) not everything has a nf, but if it does, outside-in (cbn) reduction gets it
b) things with distinct nf are distinct: ∃ args s.t. applying to them gets
distinct objs
c) not true for non-nf: ∃ distinct exprs which under natural interpretation
are the same...how do you prove such?
operational
in terms of machine
TM ...boo
compiler...boo
vdl...sigh
lisp ....
denotational
language = syntax + pragmatics + semantics
cf. λ-calc
proc is just math function
mg.
denotational model for lisp
reduction rules for lisp
red(<e,a>) = A  <=>  e(a) = A, i.e. e is an expression; its denotation is a function which, when
applied to an argument (an env), gives a value (a denotation)
also eval(e*,a*) = e(a), where e* and a* are the representations of e and a